Liberal interventionism is a foreign policy doctrine holding that liberal democratic states should intervene in other countries, by military means if necessary, to protect human rights, promote democracy, or prevent mass atrocities and conflicts. It rests on the idea that such intervention can make the world safer and more just.